Testing independence in nonparametric regression
Authors
Abstract
Similar resources
Tests for Independence in Nonparametric Regression
Consider the nonparametric regression model Y = m(X) + ε, where the function m is smooth but unknown. We construct tests for the independence of ε and X, based on n independent copies of (X, Y). The testing procedures are based on differences of neighboring Y's. We establish asymptotic results for the proposed test statistics, investigate their finite sample properties through a simulation ...
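The neighboring-differences idea above can be illustrated with a minimal sketch: sorting by X and taking first differences of Y cancels the smooth trend m, leaving (approximately) differences of errors, whose magnitudes should carry no information about X under independence. The function `neighbor_difference_stat` below is a hypothetical illustration of this idea, not the authors' actual test statistic:

```python
import numpy as np

def neighbor_difference_stat(x, y):
    """Crude illustration of a difference-based check: sort by x, take
    first differences of y (which cancel the smooth trend m for dense
    design points), and correlate |differences| with x. Under
    independence of the error and X this correlation should be near
    zero. This is NOT the paper's statistic, only a sketch."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    d = np.diff(ys)                     # ~ eps_{i+1} - eps_i for smooth m
    mid = 0.5 * (xs[1:] + xs[:-1])      # midpoint of each neighboring pair
    return np.corrcoef(np.abs(d), mid)[0, 1]

rng = np.random.default_rng(0)
x = rng.uniform(size=2000)
# errors independent of X (homoscedastic):
y_indep = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
# error scale depends on X, so independence fails:
y_dep = np.sin(2 * np.pi * x) + rng.normal(size=x.size) * (0.1 + 0.8 * x)
print(abs(neighbor_difference_stat(x, y_indep)))   # small
print(abs(neighbor_difference_stat(x, y_dep)))     # clearly larger
```

A real test would, as the abstract indicates, require the asymptotic distribution of the chosen statistic to calibrate critical values.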
Nonparametric testing of conditional independence
Conditional independence of Y and Z given X holds if and only if the following two conditions hold:
• CI1: The expected conditional covariance between arbitrary functions of Y and Z given X is zero (where the expectation is taken with respect to X).
• CI2: The conditional covariance between arbitrary functions of Y and Z given X does not depend on X.
Based on this decomposition, we propose a si...
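Condition CI1 can be made concrete with a small empirical sketch: slice X into quantile bins and average the within-bin sample covariances of Y and Z. This uses only the identity functions of Y and Z (the paper's conditions involve arbitrary functions, which this sketch does not attempt), and `mean_conditional_cov` is an illustrative name, not part of the paper:

```python
import numpy as np

def mean_conditional_cov(x, y, z, n_bins=20):
    """Estimate E_X[Cov(Y, Z | X)] by slicing X into quantile bins and
    averaging the within-bin sample covariances of (Y, Z), weighted by
    bin size. Under conditional independence this should be near zero
    (condition CI1 with identity functions)."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    total, n = 0.0, len(x)
    for b in range(n_bins):
        mask = idx == b
        if mask.sum() > 1:
            total += np.cov(y[mask], z[mask])[0, 1] * mask.sum()
    return total / n

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=n)
y = x + rng.normal(size=n)
z_ci = x + rng.normal(size=n)            # Y, Z conditionally independent given X
z_dep = x + 0.5 * y + rng.normal(size=n)  # Z depends on Y beyond X
print(mean_conditional_cov(x, y, z_ci))   # near zero
print(mean_conditional_cov(x, y, z_dep))  # near 0.5 = 0.5 * Var(Y | X)
```

Note that Y and z_ci are marginally correlated (both load on X), so an unconditional covariance would not distinguish the two cases; conditioning on X is what matters.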
Tests for independence in nonparametric regression (supplement)
Proof of (2.17). From (2.10) we have, with high probability for large $n$ and uniformly in $x$ and $y$,
$$\sqrt{n}\bigl(F_n(x,y)-\hat F_X(x)\hat G(y)\bigr)\le \alpha_n\!\Bigl(x,\,y+\tfrac{\log^2 n}{n}\Bigr)-G(y)\,\alpha_n(x,\infty)-\hat F_X(x)\,\alpha_n\!\Bigl(\infty,\,y-\tfrac{\log^2 n}{n}\Bigr)+\frac{2C\log^2 n}{\sqrt{n}},$$
$$\sqrt{n}\bigl(F_n(x,y)-\hat F_X(x)\hat G(y)\bigr)\ge \alpha_n\!\Bigl(x,\,y-\tfrac{\log^2 n}{n}\Bigr)-G(y)\,\alpha_n(x,\infty)-\hat F_X(x)\,\alpha_n\!\Bigl(\infty,\,y+\tfrac{\log^2 n}{n}\Bigr)-\frac{2C\log^2 n}{\sqrt{n}}.$$
Set $V_{n,0}=\sqrt{n}\,(F_n-\hat F_X\hat G)$. From (2.12) and...
Nonparametric Independence Testing for Small Sample Sizes
This paper deals with the problem of nonparametric independence testing, a fundamental decision-theoretic problem that asks if two arbitrary (possibly multivariate) random variables X, Y are independent or not, a question that comes up in many fields like causality and neuroscience. While quantities like correlation of X, Y only test for (univariate) linear independence, natural alternatives like ...
Nonparametric independence testing via mutual information
We propose a test of independence of two multivariate random vectors, given a sample from the underlying population. Our approach, which we call MINT, is based on the estimation of mutual information, whose decomposition into joint and marginal entropies facilitates the use of recently-developed efficient entropy estimators derived from nearest neighbour distances. The proposed critical values,...
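The decomposition mentioned above, $I(X;Y) = H(X) + H(Y) - H(X,Y)$, can be sketched with a plain Kozachenko-Leonenko nearest-neighbour entropy estimator. This is only an illustration of the entropy-decomposition idea under simplifying assumptions (brute-force distances, no bias correction); MINT's actual construction and its calibrated critical values are more refined, and the function names below are hypothetical:

```python
import numpy as np
from math import lgamma, log, pi

def _kl_entropy(z, k=3):
    """Kozachenko-Leonenko nearest-neighbour entropy estimate (nats):
    H ~= psi(n) - psi(k) + log(V_d) + (d/n) * sum_i log(eps_i),
    where eps_i is the distance from point i to its k-th nearest
    neighbour and V_d is the volume of the d-dimensional unit ball.
    psi(n) - psi(k) equals the harmonic sum 1/k + ... + 1/(n-1)."""
    z = np.asarray(z, dtype=float)
    if z.ndim == 1:
        z = z[:, None]
    n, d = z.shape
    dists = np.sqrt(((z[:, None, :] - z[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(dists, np.inf)
    eps = np.sort(dists, axis=1)[:, k - 1]          # distance to k-th NN
    log_vd = (d / 2) * log(pi) - lgamma(d / 2 + 1)  # log volume of unit d-ball
    psi_diff = np.sum(1.0 / np.arange(k, n))        # psi(n) - psi(k)
    return psi_diff + log_vd + d * np.mean(np.log(eps))

def mutual_information(x, y, k=3):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), each term estimated via _kl_entropy.
    Only a sketch of the decomposition MINT builds on."""
    xy = np.column_stack([np.atleast_2d(x.T).T, np.atleast_2d(y.T).T])
    return _kl_entropy(x, k) + _kl_entropy(y, k) - _kl_entropy(xy, k)

rng = np.random.default_rng(2)
x = rng.normal(size=500)
y_indep = rng.normal(size=500)           # independent of x: MI near 0
y_dep = x + 0.1 * rng.normal(size=500)   # strongly dependent: MI large
print(mutual_information(x, y_indep))
print(mutual_information(x, y_dep))
```

A test would compare the estimated mutual information against a critical value, e.g. obtained by recomputing the statistic over random permutations of one sample.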
Journal
Journal title: Journal of Multivariate Analysis
Year: 2009
ISSN: 0047-259X
DOI: 10.1016/j.jmva.2009.01.012